Confusion of Tagged Perturbations in Forward Automatic Differentiation of Higher-Order Functions
Authors
Hamilton Inst & Dept Comp Sci, NUI Maynooth, Co. Kildare, Ireland
School of Electrical and Computer Engineering, Purdue University, West Lafayette IN 47907-2035, USA
Abstract
Forward Automatic Differentiation (AD) is a technique for augmenting programs to both perform their original calculation and also compute its directional derivative. The essence of Forward AD is to attach a derivative value to each number, and propagate these through the computation. When derivatives are nested, the distinct derivative calculations, and their associated attached values, must be distinguished. In dynamic languages this is typically accomplished by creating a unique tag for each application of the derivative operator, tagging the attached values, and overloading the arithmetic operators. We exhibit a subtle bug, present in fielded implementations, in which perturbations are confused despite the tagging machinery.

1 Forward AD using Tagged Tangents

Forward AD (Wengert, 1964) computes the derivative of a function f : R → α at a point c by evaluating f(c + ε) under a nonstandard interpretation that associates a conceptually infinitesimal perturbation with each real number, propagates these augmented values according to the rules of calculus (Leibniz, 1664), and extracts the perturbation of the result. When x is a number, we use x + x̄ε to denote a tangent-vector bundle: the primal value x bundled with the tangent value x̄, where x̄ has the same type as x. We consider this tangent-vector bundle to also be a number, with arithmetic defined by regarding it as a truncated power series, or equivalently, by taking ε² = 0 but ε ≠ 0. This implies that f(x + x̄ε) = f(x) + x̄f′(x)ε, where f′(x) is the first derivative of f at x (Newton, 1704).
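To make the tagging machinery concrete, the following is a minimal sketch in Python, assuming a dynamic language with operator overloading as described above. The names (Dual, derivative, _lift, _tangent) and the recursive extraction strategy are illustrative choices, not the paper's implementation, and the sketch does not reproduce the higher-order-function bug the paper exhibits; it only shows how a unique tag per derivative invocation is meant to keep nested perturbations distinct.

import itertools

_fresh_tag = itertools.count()  # one unique tag per application of the derivative operator


class Dual:
    """A tangent-vector bundle x + x̄ε, tagged with the derivative invocation that introduced ε."""

    def __init__(self, primal, tangent, tag):
        self.primal, self.tangent, self.tag = primal, tangent, tag

    # Arithmetic treats the bundle as a truncated power series: ε² = 0 but ε ≠ 0.
    def __add__(self, other):
        other = _lift(other, self.tag)
        return Dual(self.primal + other.primal, self.tangent + other.tangent, self.tag)

    __radd__ = __add__

    def __mul__(self, other):
        other = _lift(other, self.tag)
        return Dual(self.primal * other.primal,
                    self.primal * other.tangent + self.tangent * other.primal,
                    self.tag)

    __rmul__ = __mul__


def _lift(x, tag):
    """View x as a bundle with zero tangent for this tag; a bundle carrying a
    different tag belongs to another derivative invocation and is treated as a constant."""
    if isinstance(x, Dual) and x.tag == tag:
        return x
    return Dual(x, 0, tag)


def _tangent(y, tag):
    """Extract the coefficient of the ε that belongs to `tag` from a possibly nested result."""
    if not isinstance(y, Dual):
        return 0
    if y.tag == tag:
        return y.tangent
    return Dual(_tangent(y.primal, tag), _tangent(y.tangent, tag), y.tag)


def derivative(f, c):
    """d/dx f(x) at x = c: evaluate f(c + 1·ε) under a fresh tag and extract that ε's coefficient."""
    tag = next(_fresh_tag)
    return _tangent(f(Dual(c, 1.0, tag)), tag)


print(derivative(lambda x: x * x, 3.0))  # 6.0
# Nested derivatives: the two ε's must stay distinct for this to give
# 2.0 = d/dx [x · (d/dy x·y at y = 2)] at x = 1.
print(derivative(lambda x: x * derivative(lambda y: x * y, 2.0), 1.0))  # 2.0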
Similar articles
Nesting forward-mode AD in a functional framework
We discuss the implications of the desire to augment a functional-programming language with a derivative-taking operator using forward-mode automatic differentiation (AD). The primary technical difficulty in doing so lies in ensuring correctness in the face of nested invocation of that operator, due to the need to distinguish perturbations introduced by distinct invocations. We exhibit a series ...
Cortical Activity During Postural Recovery in Response to Predictable and Unpredictable Perturbations in Healthy Young and Older Adults: A Quantitative EEG Assessment
Introduction: To investigate the effects of predictable and unpredictable external perturbations on cortical activity in healthy young and older adults. Methods: Twenty healthy older and 19 healthy young adults were exposed to predictable and unpredictable external perturbations, and their cortical activity upon postural recovery was measured using a 32-channel quantitative encephalography. Th...
Introduction to Automatic Differentiation and MATLAB Object-Oriented Programming
An introduction to both automatic differentiation and object-oriented programming can enrich a numerical analysis course that typically incorporates numerical differentiation and basic MATLAB computation. Automatic differentiation consists of exact algorithms on floating-point arguments. This implementation overloads standard elementary operators and functions in MATLAB with a derivative rule i...
Genetic Diversity of Iranian and Some of European Grapes Revealed by Microsatellite Markers
In order to characterize Iranian grape (Vitis vinifera L.) germplasm, 136 genotypes were collected from five grape growing regions (Azarbaijan, Qazvin, Kordestan, Khorasan and Fars) and genotyped along with 36 European cultivars using 9 sequence tagged microsatellite sites (STMS) markers. The used set of markers could distinguish all 172 genotypes under study. Altogether 84 polymorphic alleles ...
The Principles of First Order Automatic Differentiation
This article provides a short overview of the theory of First Order Automatic Differentiation (AD) for readers unfamiliar with this topic. In particular, we summarize different characterisations of Forward AD, like the vector-matrix based approach, the idea of lifting functions to the algebra of dual numbers, the method of Taylor series expansion on dual numbers and the application of the push-...
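As a complement to the tagged sketch above, here is a minimal, tag-free illustration (again in Python; the names DualNumber and lift_sin are hypothetical) of what this summary calls lifting a function to the algebra of dual numbers, using the truncated Taylor expansion of a primitive.

import math


class DualNumber:
    """x + x̄ε with ε² = 0: the standard dual-number representation for first-order Forward AD."""

    def __init__(self, primal, tangent):
        self.primal, self.tangent = primal, tangent


def lift_sin(d):
    # sin(x + x̄ε) = sin(x) + x̄·cos(x)·ε, from the Taylor expansion about x truncated after ε.
    return DualNumber(math.sin(d.primal), d.tangent * math.cos(d.primal))


# d/dx sin(x) at x = 0.5 is cos(0.5); seeding the tangent with 1.0 extracts it.
print(lift_sin(DualNumber(0.5, 1.0)).tangent)  # 0.8775825618903728
print(math.cos(0.5))                           # same value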
Journal: CoRR
Volume: abs/1211.4892
Pages: -
Publication date: 2012